Approximation of state-space trajectories by locally recurrent globally feed-forward neural networks

Author

  • Krzysztof Patan
Abstract

The paper investigates the approximation abilities of a special class of discrete-time dynamic neural networks. The networks considered are called locally recurrent globally feed-forward because they are built from dynamic neuron models containing inner feedback loops, while the interconnections between neurons are strictly feed-forward, as in the well-known multi-layer perceptron. The paper presents analytical results showing that a locally recurrent network with two hidden layers can approximate the state-space trajectory produced by any Lipschitz continuous function with arbitrary accuracy. Moreover, based on these results, the network can be simplified and transformed into a more practical structure needed in real-world applications.
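The architecture described above can be sketched in code. The following is a minimal, hypothetical illustration, not the paper's exact formulation: each neuron filters its weighted input through a small IIR filter (the "inner feedback"), then applies a static sigmoid, and layers are wired strictly feed-forward. The class names, the second-order filter, the sigmoid activation, and the random weights are all illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class DynamicNeuron:
    """Neuron with an inner IIR filter (local feedback); illustrative sketch."""
    def __init__(self, n_inputs, rng, order=2):
        self.w = rng.normal(scale=0.5, size=n_inputs)   # input weights
        self.b = rng.normal(scale=0.5, size=order + 1)  # feed-forward filter coefficients
        self.a = rng.normal(scale=0.3, size=order)      # feedback filter coefficients
        self.u_hist = np.zeros(order + 1)  # past weighted inputs
        self.x_hist = np.zeros(order)      # past internal states

    def step(self, u):
        s = self.w @ u                                   # aggregate layer inputs
        self.u_hist = np.roll(self.u_hist, 1)
        self.u_hist[0] = s
        x = self.b @ self.u_hist - self.a @ self.x_hist  # IIR state update
        self.x_hist = np.roll(self.x_hist, 1)
        self.x_hist[0] = x
        return sigmoid(x)                                # static output nonlinearity

class LRGFLayer:
    """Feed-forward layer of dynamic neurons: no connections between neurons."""
    def __init__(self, n_inputs, n_neurons, rng):
        self.neurons = [DynamicNeuron(n_inputs, rng) for _ in range(n_neurons)]

    def step(self, u):
        return np.array([n.step(u) for n in self.neurons])

# Two hidden dynamic layers and a linear read-out, mirroring the
# two-hidden-layer structure appearing in the approximation result.
rng = np.random.default_rng(42)
layer1 = LRGFLayer(1, 4, rng)
layer2 = LRGFLayer(4, 3, rng)
w_out = np.full(3, 1.0 / 3.0)

for k in range(5):
    u = np.array([np.sin(0.2 * k)])          # scalar input sequence
    y = w_out @ layer2.step(layer1.step(u))  # network output at step k
```

Note that all recurrence lives inside each neuron's filter state; removing the `a` coefficients reduces the network to an ordinary static MLP with tapped-delay inputs, which is what makes gradient training of such structures tractable.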


Similar resources

STRUCTURAL DAMAGE DETECTION BY MODEL UPDATING METHOD BASED ON CASCADE FEED-FORWARD NEURAL NETWORK AS AN EFFICIENT APPROXIMATION MECHANISM

Vibration-based techniques of structural damage detection using the model updating method are computationally expensive for large-scale structures. In this study, after precisely locating the eventual damage of a structure using a modal strain energy based index (MSEBI), to efficiently reduce the computational cost of model updating during the optimization process of damage severity detection, the M...


Numerical treatment for nonlinear steady flow of a third grade fluid in a porous half space by neural networks optimized

In this paper, steady flow of a third-grade fluid in a porous half space has been considered. This problem is a nonlinear two-point boundary value problem (BVP) on a semi-infinite interval. The solution for this problem is given by a numerical method based on the feed-forward artificial neural network model using radial basis activation functions trained with an interior point method. ...


Globally Normalized Transition-Based Neural Networks

We introduce a globally normalized transition-based neural network model that achieves state-of-the-art part-of-speech tagging, dependency parsing and sentence compression results. Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models. The key insight is based on a novel proof illus...


Approximating the Semantics of Logic Programs by Recurrent Neural Networks

Abstract. In [8] we have shown how to construct a 3-layer recurrent neural network that computes the iteration of the meaning function T_P of a given propositional logic program, which corresponds to the computation of the semantics of the program. In this article we define a notion of approximation for interpretations and prove that there exists a 3-layer feed-forward neural network that approx...


Dynamical multilayer neural networks that learn continuous trajectories

Feed-forward multilayer networks (perceptrons, radial basis function (RBF) networks, probabilistic networks, etc.) are currently used as "static systems" in pattern recognition, speech generation, identification and control, prediction, etc. (see, e.g., [1]). Theoretical works by several researchers, including [2] and [3], have proved that, even with one hidden layer, a perceptron neural net...



Journal:
  • Neural Networks: the official journal of the International Neural Network Society

Volume 21, Issue 1

Pages: -

Publication year: 2008